
Ap loss #1

Merged
7 commits merged into master from ap_loss
Apr 2, 2020
Conversation

@ghost ghost commented Apr 1, 2020

Add AP loss classes and a Sampler class for sampling non-correspondences.
Add support for loss choice in the training class.

@ghost ghost requested a review from swiatkowski April 1, 2020 20:20
Owner

@swiatkowski swiatkowski left a comment
Please address my comments, but looks good in general!

else:
raise ValueError("Couldn't find your loss_function: " + self._config['loss_function']['name'])

loss_function.cuda()
Owner

Why is this needed?
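For context on the two lines under review: a sketch of what config-driven loss selection might look like, assuming the config shape `self._config['loss_function']['name']` from the diff above. The loss names and factory functions here are illustrative placeholders, not the actual classes added in this PR.

```python
# Hypothetical registry mirroring the selection logic in the training class.
def _make_contrastive_loss():
    return 'pixelwise-contrastive-loss'

def _make_ap_loss():
    return 'ap-loss'

LOSSES = {
    'pixelwise_contrastive': _make_contrastive_loss,
    'ap_loss': _make_ap_loss,
}

def build_loss(config):
    name = config['loss_function']['name']
    if name not in LOSSES:
        raise ValueError("Couldn't find your loss_function: " + name)
    return LOSSES[name]()

loss_function = build_loss({'loss_function': {'name': 'ap_loss'}})

# Regarding the question above: loss_function.cuda() moves any learnable
# parameters or buffers the loss module owns onto the GPU. For a stateless
# loss the call is effectively a no-op, and wrapping it in
# `if torch.cuda.is_available():` keeps the script runnable on CPU-only hosts.
```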


for epoch in range(50): # loop over the dataset multiple times

print("Epoch {}/50".format(epoch))
Owner

Define 50 as a variable?
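The suggestion above might look like this; `NUM_EPOCHS` is a hypothetical name (it could equally come from the training config):

```python
NUM_EPOCHS = 50  # hypothetical constant; could also be read from the config

messages = []
for epoch in range(NUM_EPOCHS):  # loop over the dataset multiple times
    messages.append("Epoch {}/{}".format(epoch + 1, NUM_EPOCHS))

print(messages[0])
print(messages[-1])
```

Naming the constant means the loop bound and the progress message can never drift apart.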

Owner

Up :)

image_a_depth_numpy = np.asarray(image_a_depth)
image_b_depth_numpy = np.asarray(image_b_depth)

image_a_mask_numpy = np.asarray(image_a_mask)
Owner

Shouldn't this be behind some flag? Like loss == contrastive-loss, etc.?

Author

This is a workaround for this issue: RobotLocomotion/pytorch-dense-correspondence#204
Short story: a few mask files are empty. If so, skip this training sample.

Owner

@gasiortomasz What I meant on Slack is that we should add the comment you wrote here, but in the code, above this section.
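The empty-mask check discussed in this thread could be factored out as below; `has_valid_mask` is a hypothetical helper name, shown only to illustrate the workaround for RobotLocomotion/pytorch-dense-correspondence#204.

```python
import numpy as np

def has_valid_mask(mask):
    """Return True when the mask contains at least one nonzero pixel.

    A few mask files in the dataset are entirely empty; samples with such
    masks are skipped because no correspondences can be drawn from them.
    """
    return bool(np.asarray(mask).any())

# An empty mask (the failure case) versus a usable one.
empty_mask = np.zeros((4, 4), dtype=np.uint8)
good_mask = np.zeros((4, 4), dtype=np.uint8)
good_mask[1, 2] = 255
```

Keeping the explanation in the helper's docstring addresses the reviewer's request to document the workaround next to the code.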

@@ -0,0 +1,139 @@
import numpy as np
Owner

Let's add a comment at the top of this file saying where this code comes from. This will give more context to readers and might help with debugging. Also, please clarify what is taken directly from the R2D2 code and what you needed to change/add.

@ghost ghost merged commit a02fa0c into master Apr 2, 2020
import torch.nn as nn
import torch.nn.functional as F

# this class is taken from https://github.com/naver/r2d2/blob/master/nets/ap_loss.py
Owner

@gasiortomasz Usually you would include such comments at the end of the docstring of the class. Here, behind the line starting "Note: typically..."

Owner

This applies to all the comments you added here.
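The placement the reviewer suggests might look like the sketch below. The docstring body is a placeholder (only the attribution URL comes from the diff); the real class lives in the PR's ap_loss module.

```python
class APLoss(object):
    """Differentiable approximation of Average Precision.

    Note: typically the details of the approximation are described here
    (placeholder summary for illustration).

    This class is taken from
    https://github.com/naver/r2d2/blob/master/nets/ap_loss.py
    """
```

Putting the attribution inside the docstring keeps it visible in `help(APLoss)` and in generated documentation, rather than only in a source comment.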


for epoch in range(50): # loop over the dataset multiple times

print("Epoch {}/50".format(epoch))
Owner

Up :)

@ghost ghost deleted the ap_loss branch April 5, 2020 19:11